How Do PDP Models Learn Quasiregularity? (Theoretical Note)

Authors

  • Woojae Kim
  • Mark A. Pitt
  • Jay I. Myung
Abstract

Parallel Distributed Processing (PDP) models have had a profound impact on the study of cognition. One domain in which they have been particularly influential is quasiregular learning, in which mastery requires both learning regularities that capture the majority of the structure in the input and learning exceptions that violate those regularities. How PDP models learn quasiregularity is still not well understood. Small- and large-scale analyses of a feedforward, three-layer network were carried out to address two fundamental issues about network functioning: how the model can learn both regularities and exceptions without sacrificing generalizability, and the nature of the hidden representation that makes this learning possible. Results show that capacity-limited learning pressures the network to form componential representations, which ensures good generalizability. Small and highly local perturbations of this representational system allow exceptions to be learned while minimally disrupting generalizability. Theoretical and methodological implications of the findings are discussed.
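The setup the abstract describes can be illustrated with a minimal sketch: a three-layer feedforward network trained by backpropagation on a toy quasiregular mapping. The dataset, layer sizes, and learning rate below are illustrative assumptions, not the simulation reported in the article.

```python
# Minimal sketch (illustrative, not the article's simulation): a feedforward
# network with one hidden layer learns a mapping that is mostly regular
# ("copy the input to the output") plus two exception items that violate it.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Twenty distinct 8-bit input patterns; targets follow the copy rule,
# except for two hand-picked exception items.
X = np.array([[int(b) for b in format(i, "08b")] for i in range(1, 21)], float)
Y = X.copy()             # regular items obey the copy rule
Y[0] = 1.0 - Y[0]        # exception: inverted output
Y[1] = np.roll(Y[1], 1)  # exception: shifted output

n_in, n_hid, n_out = 8, 16, 8
W1 = rng.normal(0.0, 0.5, (n_in, n_hid))   # input-to-hidden weights
W2 = rng.normal(0.0, 0.5, (n_hid, n_out))  # hidden-to-output weights

mse0 = np.mean((sigmoid(sigmoid(X @ W1) @ W2) - Y) ** 2)

lr = 0.5
for _ in range(5000):
    H = sigmoid(X @ W1)             # hidden representation
    O = sigmoid(H @ W2)             # network output
    dO = (O - Y) * O * (1 - O)      # backprop through output sigmoid
    dH = (dO @ W2.T) * H * (1 - H)  # backprop through hidden sigmoid
    W2 -= lr * H.T @ dO / len(X)
    W1 -= lr * X.T @ dH / len(X)

mse = np.mean((sigmoid(sigmoid(X @ W1) @ W2) - Y) ** 2)
print(f"mean squared error: {mse0:.3f} -> {mse:.3f}")
```

Inspecting the trained hidden activations `H` for regular versus exception items is one way to probe the componential-versus-local representational question the article analyzes.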


Similar Articles


The relation between pseudonormality and quasiregularity in constrained optimization

We consider optimization problems with equality, inequality, and abstract set constraints. We investigate the relations between various characteristics of the constraint set related to the existence of Lagrange multipliers. For problems with no abstract set constraint, the classical condition of quasiregularity provides the connecting link between the most common constraint qualifications and e...


Quasiregularity and rigorous diffusion of strong Hamiltonian chaos.

Exact results are derived concerning quasiregularity and diffusion of strong chaos on resonances of the sawtooth map. A chaotic ensemble of well-defined quasiregularity type (the sequence of resonances visited) is generally a fractal set whose main characteristics, the topological entropy and the Hausdorff dimension, are calculated exactly, under some conditions, using a symbolic dynamics. The ...


A Recurrent Neural Network that Learns to Count

Parallel distributed processing (PDP) architectures demonstrate a potentially radical alternative to the traditional theories of language processing that are based on serial computational models. However, learning complex structural relationships in temporal data presents a serious challenge to PDP systems. For example, automata theory dictates that processing strings from a context-free langua...


Challenging the widespread assumption that connectionism and distributed representations go hand-in-hand.

One of the central claims associated with the parallel distributed processing approach popularized by D.E. Rumelhart, J.L. McClelland and the PDP Research Group is that knowledge is coded in a distributed fashion. Localist representations within this perspective are widely rejected. It is important to note, however, that connectionist networks can learn localist representations and many connect...




Publication date: 2013